16,224 research outputs found
Unifying Parsimonious Tree Reconciliation
Evolution is a process that is influenced by various environmental factors,
e.g. the interactions between different species, genes, and biogeographical
properties. Hence, it is interesting to study the combined evolutionary history
of multiple species, their genes, and the environment they live in. A common
approach to address this research problem is to describe each individual
evolution as a phylogenetic tree and construct a tree reconciliation which is
parsimonious with respect to a given event model. Unfortunately, most
previous approaches are designed only for host-parasite systems, for gene
tree/species tree reconciliation, or for biogeography. Hence, a method is
desirable that addresses the general problem of mapping phylogenetic trees
and covers all varieties of coevolving systems, including, e.g., predator-prey
and symbiotic relationships. To close this gap, we introduce a generalized
cophylogenetic event model considering the combinatorial complete set of local
coevolutionary events. We give a dynamic programming based heuristic for
solving the maximum parsimony reconciliation problem in time O(n^2), for two
phylogenies each with at most n leaves. Furthermore, we present an exact
branch-and-bound algorithm which uses the results from the dynamic programming
heuristic for discarding partial reconciliations. The approach has been
implemented as a Java application which is freely available from
http://pacosy.informatik.uni-leipzig.de/coresym.
Comment: Peer-reviewed and presented as part of the 13th Workshop on Algorithms in Bioinformatics (WABI 2013)
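To illustrate the flavour of a parsimony reconciliation DP, the sketch below implements a simplified LCA-style dynamic program over two toy trees. This is not the paper's algorithm: it models only cospeciation and duplication events (the full event model also includes, e.g., losses and host switches), and the tree shapes, tip mapping, and event costs are all illustrative assumptions.

```python
import math
from functools import lru_cache

# Toy trees as nested tuples; leaves are strings. All names are hypothetical.
host = (("h1", "h2"), "h3")
parasite = (("p1", "p2"), "p3")
tip_map = {"p1": "h1", "p2": "h2", "p3": "h3"}  # parasite leaf -> host leaf

COSPECIATION_COST = 0  # assumed event costs, tunable
DUPLICATION_COST = 1

def is_leaf(node):
    return isinstance(node, str)

@lru_cache(maxsize=None)
def cost(p, h):
    """Cheapest reconciliation of parasite subtree p with its root fixed on host node h."""
    if is_leaf(p):
        return 0 if h == tip_map[p] else math.inf
    p1, p2 = p
    # Duplication: both child lineages stay somewhere in h's subtree.
    best = inside(p1, h) + inside(p2, h) + DUPLICATION_COST
    if not is_leaf(h):
        h1, h2 = h
        # Cospeciation: the child lineages separate into h's two child subtrees.
        best = min(best,
                   inside(p1, h1) + inside(p2, h2) + COSPECIATION_COST,
                   inside(p1, h2) + inside(p2, h1) + COSPECIATION_COST)
    return best

@lru_cache(maxsize=None)
def inside(p, h):
    """Cheapest cost of p mapped to h or to any descendant of h."""
    if is_leaf(h):
        return cost(p, h)
    return min(cost(p, h), inside(p, h[0]), inside(p, h[1]))

print(inside(parasite, host))  # prints 0: congruent trees reconcile by pure cospeciation
```

Each (p, h) pair is visited once with constant work per pair, which is what gives quadratic behaviour in the number of leaves, matching the O(n^2) bound quoted in the abstract.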
Biodiversity informatics: the challenge of linking data and the role of shared identifiers
A major challenge facing biodiversity informatics is integrating data stored in widely distributed databases. Initial efforts have relied on taxonomic names as the shared identifier linking records in different databases. However, taxonomic names have limitations as identifiers, being neither stable nor globally unique, and the pace of molecular taxonomic and phylogenetic research means that a lot of information in public sequence databases is not linked to formal taxonomic names. This review explores the use of other identifiers, such as specimen codes and GenBank accession numbers, to link otherwise disconnected facts in different databases. The structure of these links can also be exploited using the PageRank algorithm to rank the results of searches on biodiversity databases. The key to rich integration is a commitment to deploy and reuse globally unique, shared identifiers (such as DOIs and LSIDs), and the implementation of services that link those identifiers
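The PageRank idea mentioned above can be sketched on a toy graph of cross-database links. The identifiers below (DOIs, a GenBank accession, a specimen code) are illustrative placeholders, not real linked records, and the damping factor and iteration count are conventional assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank over a directed link graph.

    links maps each node to the list of nodes it links to.
    """
    nodes = set(links)
    for targets in links.values():
        nodes.update(targets)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            if targets:
                share = damping * rank[source] / len(targets)
                for t in targets:
                    new_rank[t] += share
        # Dangling nodes (no outgoing links) redistribute their rank uniformly.
        dangling = sum(rank[n] for n in nodes if not links.get(n))
        for n in nodes:
            new_rank[n] += damping * dangling / len(nodes)
        rank = new_rank
    return rank

# Hypothetical records linking to one another across databases.
links = {
    "doi:10.1000/a": ["genbank:AY123456", "specimen:MVZ:191234"],
    "doi:10.1000/b": ["genbank:AY123456"],
    "genbank:AY123456": ["specimen:MVZ:191234"],
}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # the most-linked-to record ranks highest
```

In a search over biodiversity databases, such scores could be used to sort otherwise equally matching records by how richly they are interlinked.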
Prospects of Detecting Baryon and Quark Superfluidity from Cooling Neutron Stars
Baryon and quark superfluidity in the cooling of neutron stars are
investigated. Observations could constrain combinations of the neutron or
Lambda-hyperon pairing gaps and the star's mass. However, in a hybrid star with
a mixed phase of hadrons and quarks, quark gaps larger than a few tenths of an
MeV render quark matter virtually invisible for cooling. If the quark gap is
smaller, quark superfluidity could be important, but its effects will be nearly
impossible to distinguish from those of other baryonic constituents.
Comment: 4 pages, 3 ps figures, uses RevTex(aps,prl). Submitted to Phys. Rev. Lett.
Agnesi Weighting for the Measure Problem of Cosmology
The measure problem of cosmology is how to assign normalized probabilities to
observations in a universe so large that it may have many observations
occurring at many different spacetime locations. I have previously shown how
the Boltzmann brain problem (that observations arising from thermal or quantum
fluctuations may dominate over ordinary observations if the universe expands
sufficiently and/or lasts long enough) may be ameliorated by volume averaging,
but that still leaves problems if the universe lasts too long. Here a solution
is proposed for that residual problem by a simple weighting factor 1/(1+t^2) to
make the time integral convergent. The resulting Agnesi measure appears to
avoid problems other measures may have with vacua of zero or negative
cosmological constant.
Comment: 26 pages, LaTeX; discussion is added of how Agnesi weighting appears better than other recent measures
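The convergence claim for the weighting factor 1/(1+t^2) can be made explicit with a one-line integral (in units where the weighting timescale is set to 1):

```latex
% The Agnesi weight integrates to a finite value over all future time,
\[
  \int_0^\infty \frac{dt}{1+t^2}
  \;=\; \lim_{T\to\infty} \arctan T
  \;=\; \frac{\pi}{2},
\]
% so any bounded observation rate integrated against this weight
% yields a finite, normalizable measure.
```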
Cosmological Measures without Volume Weighting
Many cosmologists (myself included) have advocated volume weighting for the
cosmological measure problem, weighting spatial hypersurfaces by their volume.
However, this often leads to the Boltzmann brain problem, that almost all
observations would be by momentary Boltzmann brains that arise very briefly as
quantum fluctuations in the late universe when it has expanded to a huge size,
so that our observations (too ordered for Boltzmann brains) would be highly
atypical and unlikely. Here it is suggested that volume weighting may be a
mistake. Volume averaging is advocated as an alternative. One consequence may
be a loss of the argument that eternal inflation gives a nonzero probability
that our universe now has infinite volume.
Comment: 15 pages, LaTeX, added references for constant-H hypersurfaces and also an idea for minimal-flux hypersurfaces
A window into the neutron star: Modelling the cooling of accretion heated neutron star crusts
In accreting neutron star X-ray transients, the neutron star crust can be
substantially heated out of thermal equilibrium with the core during an
accretion outburst. The observed subsequent cooling in quiescence (when
accretion has halted) offers a unique opportunity to study the structure and
thermal properties of the crust. Initially, crust cooling modelling studies
focussed on transient X-ray binaries with prolonged accretion outbursts (>1
year), long enough for the crust to be heated significantly and its cooling to
be detectable. Here we present the results of applying a theoretical model to the
observed cooling curve after a short accretion outburst of only ~10 weeks. In
our study we use the 2010 outburst of the transiently accreting 11 Hz X-ray
pulsar in the globular cluster Terzan 5. Observationally it was found that the
crust in this source was still hot more than 4 years after the end of its short
accretion outburst. From our modelling we found that such a long-lived hot
crust implies some unusual crustal properties such as a very low thermal
conductivity (> 10 times lower than determined for the other crust cooling
sources). In addition, we present our preliminary results of the modelling of
the ongoing cooling of the neutron star in MXB 1659-298. This transient X-ray
source went back into quiescence in March 2017 after an accretion phase of ~1.8
years. We compare our predictions for the cooling curve after this outburst
with the cooling curve of the same source obtained after its previous outburst
which ended in 2001.
Comment: 4 pages, 1 figure, to appear in the proceedings of "IAUS 337: Pulsar Astrophysics - The Next 50 Years", eds: P. Weltevrede, B.B.P. Perera, L. Levin Preston & S. Sanidas
Going nuclear: gene family evolution and vertebrate phylogeny reconciled
Gene duplications have been common throughout vertebrate evolution, introducing paralogy and so complicating phylogenetic inference from nuclear genes. Reconciled trees are one method capable of dealing with paralogy, using the relationship between a gene phylogeny and the phylogeny of the organisms containing those genes to identify gene duplication events. This allows us to infer phylogenies from gene families containing both orthologous and paralogous copies. Vertebrate phylogeny is well understood from morphological and palaeontological data, but studies using mitochondrial sequence data have failed to reproduce this classical view. Reconciled tree analysis of a database of 118 vertebrate gene families supports a largely classical vertebrate phylogeny
Opaque or transparent? A link between neutrino optical depths and the characteristic duration of short gamma-ray bursts
Cosmological gamma ray bursts (GRBs) are thought to occur from violent
hypercritical accretion onto stellar mass black holes, either following core
collapse in massive stars or compact binary mergers. This dichotomy may be
reflected in the two classes of bursts having different durations. Dynamical
calculations of the evolution of these systems are essential if one is to
establish characteristic, relevant timescales. We show here for the first time
the result of dynamical simulations, lasting approximately one second, of
post--merger accretion disks around black holes, using a realistic equation of
state and considering neutrino emission processes. We find that the inclusion
of neutrino optical depth effects produces important qualitative temporal and
spatial transitions in the evolution and structure of the disk, which may
directly reflect upon the duration and variability of short GRBs.
Comment: Accepted for publication in ApJ Letters
What Have We Learned from Policy Transfer Research? Dolowitz and Marsh Revisited
Over the last decade, policy transfer has emerged as an important concept within public policy analysis, guiding both theoretical and empirical research spanning many venues and issue areas. Using Dolowitz and Marsh's 1996 stocktake as its starting point, this article reviews what has been learned by whom and for what purpose. It finds that the literature has evolved from its rather narrow, state-centred roots to cover many more actors and venues. While policy transfer still represents a niche topic for some researchers, an increasing number have successfully assimilated it into wider debates on topics such as globalisation, Europeanisation and policy innovation. This article assesses the concept's position in the overall ‘tool-kit’ of policy analysis, examines some possible future directions and reflects on their associated risks and opportunities
Transient Observers and Variable Constants, or Repelling the Invasion of the Boltzmann's Brains
If the universe expands exponentially without end, ``ordinary observers''
like ourselves may be vastly outnumbered by ``Boltzmann's brains,'' transient
observers who briefly flicker into existence as a result of quantum or thermal
fluctuations. One might then wonder why we are so atypical. I show that tiny
changes in physics--for instance, extremely slow variations of fundamental
constants--can drastically change this result, and argue that one should be
wary of conclusions that rely on exact knowledge of the laws of physics in the
very distant future.
Comment: 4 pages, LaTeX; v2: added references; v3: more discussion of setting, alternative approaches, now 5 pages; v4: added discussion of the effect of quantum fluctuations on varying constants, appendix added, now 7 pages; v5: new reference, minor correction